- Title
- MON-535 Deep-Machine Learning for Objective Quantification of Nerves in Immunohistochemistry Specimens of Thyroid Cancer
- Creator
- Astono, Idriani; Rowe, Christopher W.; Welsh, James; Jobling, Phillip
- Relation
- Journal of the Endocrine Society Vol. 4, Issue Supplement_1, no. MON-535
- Publisher Link
- http://dx.doi.org/10.1210/jendso/bvaa046.1335
- Publisher
- Oxford University Press
- Resource Type
- journal article
- Date
- 2020
- Description
- Introduction: Nerves in the cancer microenvironment have prognostic significance, and nerve-cancer crosstalk may contribute to tumour progression, but the role of nerves in thyroid cancer is not known (1). Reproducible techniques to quantify innervation are lacking, with reliance on manual counting or basic single-parameter digital quantification.
- Aims: To determine whether a deep machine learning algorithm could objectively quantify nerves in a digital histological dataset of thyroid cancers immunostained for the specific pan-neuronal marker PGP9.5.
- Methods: A training dataset of 30 digitised papillary thyroid cancer immunohistochemistry slides was manually screened for PGP9.5-positive nerves and annotated using QuPath (2), yielding 1500 true positive nerves. This dataset was used to train the deep learning algorithm. First, a colour filter identified pixels positive for PGP9.5 (Model 1). Then, a manually tuned colour filter and clustering method identified Regions of Interest (ROIs): clusters of PGP9.5-positive pixels that may represent nerves (Model 2). These ROIs were classified by the deep learning model (Model 3), based on a Convolutional Neural Network with approximately 2.7 million trainable parameters. The full model was run on a testing dataset of thyroid cancer slides (n=5), containing 7-35 manually identified nerves per slide. Model predictions were validated by human assessment of a random subset of 100 ROIs. The code was written in Python and the model was developed in Keras.
- Results: Model 2 (colour filter + clustering only) identified a median of 2247 ROIs per slide (range 349-4748), which included 94% of the manually identified nerves. However, most Model 2 ROIs were false positives (FP) (median 85% FP, range 68-95%), indicating that Model 2 was sensitive but poorly specific for nerve identification. Model 3 (deep learning) identified fewer ROIs per slide (median 1068, range 150-3091), but still correctly identified 94% of the manually annotated nerves. Of the additional ROIs detected by Model 3, the median FP rate was 35%; however, in slides with higher non-specific immunostaining, the FP rate was >90%.
- Conclusion: Simple image analysis based on colour filtration/cluster analysis does not accurately identify immunohistochemically labelled nerves in thyroid cancers. Adding deep learning improves sensitivity with acceptable specificity, and significantly increases the number of true positive nerves detected compared to manual counting. However, the current deep learning model lacks specificity in the setting of non-specific immunostaining, which provides a basis for improving future iterations of this model to facilitate study of the significance of innervation in thyroid and other cancers.
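The sketch below illustrates one plausible shape for the three-stage filter → cluster → classify pipeline described in the Methods. The abstract names Python and Keras but publishes no code, so the colour-deconvolution filter, the connected-component clustering, all thresholds, and the CNN layout are assumptions; the layer sizes were chosen only to land near the quoted ~2.7 million trainable parameters.

```python
# Minimal sketch of the filter -> cluster -> classify pipeline (assumed form;
# not the authors' released code). Requires numpy, scikit-image, tensorflow.
import numpy as np
from skimage.color import rgb2hed
from skimage.measure import label, regionprops
from tensorflow import keras
from tensorflow.keras import layers

def dab_positive_mask(rgb_tile: np.ndarray, threshold: float = 0.05) -> np.ndarray:
    """Model 1 (assumed): mark pixels positive for the brown DAB chromogen
    that visualises PGP9.5, via colour deconvolution into H/E/DAB channels."""
    hed = rgb2hed(rgb_tile)
    return hed[..., 2] > threshold  # DAB channel; the cutoff is an assumption

def candidate_rois(mask: np.ndarray, min_pixels: int = 50) -> list:
    """Model 2 (assumed): cluster positive pixels into connected components
    and keep those large enough to plausibly be nerves; returns bounding boxes."""
    return [r.bbox for r in regionprops(label(mask)) if r.area >= min_pixels]

def build_roi_classifier(input_shape=(64, 64, 3)) -> keras.Model:
    """Model 3 (assumed layout): a small CNN that classifies each ROI crop as
    nerve vs. non-nerve. Layer sizes are illustrative, picked to land near
    the ~2.7 million trainable parameters quoted in the abstract."""
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(212, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # P(ROI is a nerve)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

In this sketch, the ROIs from Model 2 would be cropped from the slide, resized to 64×64, and passed through the classifier, mirroring the sequence in the Methods: colour filter, clustering into candidate ROIs, then CNN classification.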
- Subject
- cancer microenvironment; thyroid; positive nerves; immunostaining
- Identifier
- http://hdl.handle.net/1959.13/1478188
- Identifier
- uon:50131
- Identifier
- ISSN:2472-1972
- Rights
- This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial reproduction and distribution of the work, in any medium, provided the original work is not altered or transformed in any way, and that the work is properly cited. For commercial re-use, please contact journals.permissions@oup.com
- Language
- eng
| File | Description | Size | Format |
|---|---|---|---|
| ATTACHMENT02 | Publisher version (open access) | 110 KB | Adobe Acrobat PDF |